Journal of Computer Applications


Aspect-level sentiment analysis model based on alternating attention mechanism and graph convolutional network

YANG Xianfeng1, TANG Yilei1, LI Ziqiang2

  1. School of Computer Science, Southwest Petroleum University, Chengdu 610500, China; 2. College of Film, Television and Media, Sichuan Normal University, Chengdu 610066, China
  • Received: 2023-04-27  Revised: 2023-06-13  Accepted: 2023-06-30  Online: 2023-12-04  Published: 2023-12-04
  • Corresponding author: YANG Xianfeng
  • Supported by:
    Key Research and Development Program of Sichuan Provincial Department of Science and Technology; National Natural Science Foundation of China




Abstract: Aspect-level sentiment analysis aims to predict the sentiment polarity of specific targets in a given text. To address two problems — that the syntactic relationship between aspect words and their context is often ignored, and that average pooling shrinks the differences among attention weights — an aspect-level sentiment analysis model based on an Alternating Attention mechanism and a Graph Convolutional Network (AA-GCN) was proposed. Firstly, a Bidirectional Long Short-Term Memory (Bi-LSTM) network was used to model the semantics of the context and the aspect words. Secondly, a GCN built on the syntactic dependency tree was used to learn positional information and dependency relations, and the alternating attention mechanism then performed multi-level interactive learning to adaptively adjust the attention paid to the target words. Finally, the corrected aspect features and the context features were concatenated to form the final classification basis. Compared with the Target-Dependent Graph Attention Network (TD-GAT), the proposed model improved accuracy on four datasets by 2.67%, 1.13%, 1.27% and 1.84%, respectively, and F1 score on five datasets by 2.82%, 0.98%, 4.56%, 4.89% and 4.46%, respectively, verifying the effectiveness of exploiting syntactic relationships and increasing the attention given to keywords.
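The pipeline the abstract describes (Bi-LSTM word features → dependency-tree GCN → attention focused on the aspect word) can be sketched minimally as below. This is an illustrative assumption, not the authors' implementation: random vectors stand in for the Bi-LSTM hidden states, a single dot-product attention step stands in for the full alternating-attention mechanism, and all names and dimensions are invented for the example.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution step over a syntactic-dependency adjacency
    matrix: add self-loops, average each word with its dependency
    neighbours, project with W, apply ReLU."""
    A_hat = A + np.eye(A.shape[0])           # self-loops so a word keeps its own feature
    deg = A_hat.sum(axis=1, keepdims=True)   # node degrees for mean aggregation
    return np.maximum(0.0, (A_hat / deg) @ H @ W)

def attention(query, keys):
    """Dot-product attention of an aspect query over context keys,
    returning an attention-weighted context summary."""
    scores = keys @ query
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ keys

rng = np.random.default_rng(0)
n_words, d = 5, 8                            # toy 5-word sentence
H = rng.standard_normal((n_words, d))        # stand-in for Bi-LSTM hidden states
A = np.zeros((n_words, n_words))             # toy dependency edges (chain)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    A[i, j] = A[j, i] = 1.0
W = rng.standard_normal((d, d))

H_gcn = gcn_layer(H, A, W)                   # syntax-aware word features
aspect = H_gcn[2]                            # suppose word 2 is the aspect term
context_repr = attention(aspect, H_gcn)      # one attention pass over the context
```

In the paper's model, the attention step would alternate between aspect and context representations over several rounds before the two feature vectors are concatenated for classification; here a single pass shows the mechanics.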

Key words: natural language processing, deep learning, aspect-level sentiment analysis, alternating-attention mechanism, Graph Convolutional Network (GCN)
